Neighbor Contrastive Learning on Learnable Graph Augmentation
Authors
Abstract
In recent years, graph contrastive learning (GCL), which aims to learn representations from unlabeled graphs, has made great progress. However, existing GCL methods mostly adopt human-designed graph augmentations, which are sensitive to various graph datasets. In addition, contrastive losses originally developed in computer vision have been directly applied to graph data, where neighboring nodes are regarded as negatives and consequently pushed far apart from the anchor. However, this contradicts the homophily assumption of networks: connected nodes often belong to the same class and should be close to each other. In this work, we propose an end-to-end automatic GCL method, named NCLA, which applies neighbor contrastive learning on learnable graph augmentation. Several augmented views with adaptive topology are automatically learned by a multi-head graph attention mechanism, which is compatible with various graph datasets without prior domain knowledge. In addition, a neighbor contrastive loss is devised to allow multiple positives per anchor by taking network topology as a supervised signal. Both augmentations and embeddings are learned end to end in the proposed NCLA. Extensive experiments on benchmark datasets demonstrate that NCLA yields state-of-the-art node classification performance among self-supervised GCL methods and even exceeds supervised ones when labels are extremely limited. Our code is released at https://github.com/shenxiaocam/NCLA.
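The core idea of treating an anchor's graph neighbors as additional positives (rather than negatives, as in standard InfoNCE-style losses) can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's exact loss: the function name, the plain cosine-similarity formulation, and the use of a binary adjacency matrix as the only supervision are assumptions made for illustration.

```python
import numpy as np

def neighbor_contrastive_loss(z1, z2, adj, tau=0.5):
    """Illustrative neighbor contrastive loss (hypothetical sketch).

    z1, z2 : (n, d) node embeddings from two augmented views.
    adj    : (n, n) binary adjacency matrix; an anchor's neighbors are
             treated as extra positives, following the homophily assumption.
    tau    : temperature for the softmax over similarities.
    """
    def normalize(z):
        return z / np.linalg.norm(z, axis=1, keepdims=True)

    z1, z2 = normalize(z1), normalize(z2)
    sim = np.exp(z1 @ z2.T / tau)            # inter-view cosine similarities
    n = sim.shape[0]
    # Positives per anchor: the same node in the other view plus its neighbors.
    pos_mask = adj.astype(bool) | np.eye(n, dtype=bool)
    pos = (sim * pos_mask).sum(axis=1)
    denom = sim.sum(axis=1)                  # all nodes in the other view
    return float(-np.log(pos / denom).mean())
```

Because the positive set is a subset of the denominator, the loss is always non-negative; pulling neighbors toward the anchor (rather than pushing them away) is what distinguishes this family of losses from a vanilla image-style contrastive objective.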
Similar resources
On Contrastive Divergence Learning
Maximum-likelihood (ML) learning of Markov random fields is challenging because it requires estimates of averages that have an exponential number of terms. Markov chain Monte Carlo methods typically take a long time to converge on unbiased estimates, but Hinton (2002) showed that if the Markov chain is only run for a few steps, the learning can still work well and it approximately minimizes a d...
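The few-step approximation described above (CD-k) can be sketched for a minimal binary RBM. This is an illustrative CD-1 update with bias terms omitted; the function name and shapes are assumptions, not code from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, v0, lr=0.1):
    """One CD-1 weight update for a binary RBM (biases omitted for brevity).

    Exact ML would require running the Gibbs chain to equilibrium to
    estimate the model's statistics; CD-1 instead takes a single Gibbs
    step from the data and contrasts the two correlation terms.
    """
    h0 = sigmoid(v0 @ W)                       # hidden probs given data
    h_sample = (rng.random(h0.shape) < h0).astype(float)
    v1 = sigmoid(h_sample @ W.T)               # one-step reconstruction
    h1 = sigmoid(v1 @ W)
    # Positive phase (data statistics) minus negative phase (one-step chain).
    grad = v0.T @ h0 - v1.T @ h1
    return W + lr * grad / v0.shape[0]
```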
Contrastive Learning for Image Captioning
Image captioning, a popular topic in computer vision, has achieved substantial progress in recent years. However, the distinctiveness of natural descriptions is often overlooked in previous work. It is closely related to the quality of captions, as distinctive captions are more likely to describe images with their unique aspects. In this work, we propose a new learning method, Contrastive Learn...
Contrastive Learning and Neural Oscillations
The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases. One phase has a teacher signal and one phase has no teacher signal. The weights are updated using a learning rule that correspon...
Faithful Contrastive Features in Learning
This article pursues the idea of inferring aspects of phonological underlying forms directly from surface contrasts by looking at optimality theoretic linguistic systems (Prince & Smolensky, 1993/2004). The main result proves that linguistic systems satisfying certain conditions have the faithful contrastive feature property: Whenever 2 distinct morphemes contrast on the surface in a particular...
Contrastive Learning Using Spectral Methods
In many natural settings, the analysis goal is not to characterize a single data set in isolation, but rather to understand the difference between one set of observations and another. For example, given a background corpus of news articles together with writings of a particular author, one may want a topic model that explains word patterns and themes specific to the author. Another example come...
Journal
Journal title: Proceedings of the AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i8.26168